AI traps users in a mental loop: "There is a dark reason"

Among users who have interacted with human-like chatbots for extended periods, a growing number have come to believe that artificial intelligence has gained consciousness, uncovered a conspiracy, or developed new scientific theories.
Such delusions have led to serious consequences, ranging from divorce to homelessness, forced medical treatment, and even death. According to experts, the root of the problem lies in deliberate design choices behind artificial intelligence systems.
"DARK TRENDS"Features such as "anthropomorphism", which allows chatbots to speak as human-like as possible, and "sycophancy", which constantly justifies the user and supports even unrealistic thoughts, lead users to be drawn further into various delusions.
Anthropologist Webb Keane places this design approach in the category of "dark patterns", the manipulative interface techniques used across the digital world.
DEFENSE FROM OPENAI
OpenAI rejects these criticisms, arguing that ChatGPT is designed to make users' lives easier, "not to consume their attention, but to help them use it productively." However, the company's history of rushed releases and its fix-bugs-after-launch approach reinforce the criticism that it has effectively turned millions of users into test subjects.
ntv